NAVIGATION SYSTEM FOR AN UNMANNED AERIAL VEHICLE IN AN INTERIOR ENCLOSURE (machine translation)
Patent abstract:
A navigation system for an unmanned aerial vehicle in an indoor enclosure. It has a positioning module (140) with an on-board inertial unit (116) and a memory (105) that stores a map of the physical characteristics of the site. A plurality of distance sensors (110, 112, 114) and a laser scanner (102) measure the position (x, y, z) and the yaw of the aerial vehicle. By comparison against the map of the enclosure stored in the memory (105), the estimated 3D position and yaw values of the vehicle are combined with the data provided by the inertial unit (116) to establish corrections to the pre-established route and act on the flight engines of said vehicle.

Publication number: ES2684847A1
Application number: ES201730538
Filing date: 2017-03-31
Publication date: 2018-10-04
Inventors: David TRILLO PÉREZ; José Ignacio LAMAS FONTE
Applicant: Avansig S.L.L.
IPC main class:
Patent description:
Technical Field of the Invention

The invention pertains to guidance, navigation and surveillance systems. In particular, it refers to an indoor navigation system for drones capable of avoiding obstacles and carrying out missions autonomously.

State of the Art

Currently, unmanned aerial vehicles or drones (in English "UAV", for "Unmanned Aerial Vehicle") are used in an ever wider range of areas. Among other uses, they are quite widespread in precision agriculture, photography, video filming, aerial cartography and various other applications that require extensive aerial surveillance. Conventionally, however, drones use the GPS positioning system, which has an accuracy of the order of 1 metre, and are usually directed remotely by human pilots from the ground. It would be advantageous if drones could operate autonomously, not only outdoors but also inside buildings and other constructions. Providing them with this capacity is a technological challenge, since GPS coverage tends to fail indoors. In addition, many industrial environments are precisely interior ones: most factories, for example, have covered enclosures such as warehouses for storing merchandise, process plants, etc. On the other hand, the regulations in force do not consider the interior of buildings part of the airspace; a drone could therefore operate legally without requiring a pilot (something that is required outdoors). This can be especially useful in surveillance and supervision missions in dangerous or poorly accessible environments. To achieve this objective, drones must incorporate systems capable of perceiving and responding to their local environment, altering their flight path in order to avoid colliding with objects that cross it. As mentioned, the navigation systems usually available for drones are not suitable for indoor applications, where an accuracy of the order of centimetres is necessary and coverage is not always ensured.
Although there are some proposals for navigation systems for indoor robots, they are mainly based on artificial vision systems and on readings from a 2D or 3D laser scanner. A problem associated with such proposals is the excessive computational load they require. For a drone, the capacity, memory and energy limitations of the on-board processor have to be taken into account, which in practice makes these types of proposals inadequate.

Other known technological alternatives in navigation systems that include obstacle detection and avoidance are based on odometry and INS inertial sensors, so that they are autonomous (no external references needed). They estimate the relative position using drone information at two consecutive moments, and therefore accumulate errors over long sections. Additionally, in areas close to power lines or steel structures, these proposals present failures and errors, so they are not safe.

Another current technique is based on beacons. This technique is perhaps the most widespread in other areas such as the navigation of ships, airplanes or other robots. The 3D position is computed with respect to the positions of beacons that are known and fixed. These references must be stored in memory and the scenario must be marked beforehand. Among the disadvantages of this technique are its high cost of installation and maintenance, along with a different distribution for each navigation scenario, which makes it not very flexible.

Finally, there are artificial vision techniques based on the processing of images captured by a camera. These techniques can also be used for indoor and outdoor navigation. They suffer from the drawback of relying on computationally complex algorithms that require high processing power and incur long execution times.
As a consequence, the response speed of the aerial vehicle during navigation is significantly reduced, resulting in slow and inaccurate systems.

Brief Description of the Invention

In view of the limitations of the state of the art, it would be desirable to have a precise, autonomous navigation system that is less complex to deploy and requires fewer computational resources at run time. The proposed system is based on using a map of the interior enclosure to support the guidance tasks. When the operational environment is inside buildings, the information obtainable from the plans associated with such buildings can be used. That is, in addition to the measurements taken by the sensors, a map of the area of operation stored in memory is used. This technical solution has the advantage that it determines the position of the drone in the work scenario without modifications or calibration. A reference map therefore simplifies the positioning problem and reduces computational complexity; as mentioned earlier, learning 2D or 3D maps (SLAM) involves a large processing load.

The present invention is generally defined as a navigation system for an unmanned aerial vehicle in an indoor enclosure that incorporates a positioning module comprising an on-board inertial unit and a memory. The positioning module also includes a plurality of distance sensors and a laser scanner, and makes it possible to estimate the 2D position (x, y) and yaw of the aerial vehicle in the horizontal plane, by means of a laser sweep performed by the laser scanner and a comparison with a map of the interior enclosure stored in the memory (105). The positioning module also makes it possible to estimate the vertical distance of the vehicle to both the ceiling and the ground, or to a possible obstacle either above or below the vehicle, thanks to the distance sensors.
Said positioning module makes it possible to combine the estimated 3D position and yaw values of the vehicle with the data provided by the inertial unit and thus establish corrections to the preset route. Optionally, the positioning module analyzes the data obtained by the laser scanner by applying a Scan Matching algorithm. Optionally, the positioning module includes a first estimator that analyzes the data obtained by the laser scanner and the data corresponding to the site map stored in memory; to do this, it implements the Monte Carlo algorithm and obtains an estimate of the 2D position (x, y) and yaw of the aerial vehicle. Optionally, the positioning module can in turn update the map of the interior enclosure stored in the memory (for example with obstacles, furniture, etc. found on the route followed) based on the data obtained through the distance sensors and the laser scanner; in this way, increasingly precise navigation is achieved with less computational load. Optionally, the positioning module incorporates a second estimator that analyzes the data obtained by a pair of distance sensors to estimate, redundantly, the distance of the aerial vehicle to the ground. Optionally, the positioning module also incorporates a Kalman filter to combine the estimated 3D position and yaw values of the vehicle with the acceleration and speed data of the aerial vehicle provided by the inertial unit. Optionally, the system also includes a planning module that receives corrections to the established route from the positioning module and modifies the trajectory of the aerial vehicle. Optionally, the planning module also includes a flight control and execution unit configured to receive orders to modify the preset trajectory and act on the engines of the aerial vehicle. Optionally, the positioning module also includes a 3D artificial vision camera.
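The Scan Matching step mentioned above compares two successive laser point clouds to estimate the displacement between them, typically with ICP. A minimal 2D sketch of the idea follows; it is an illustration in pure Python, not the patent's actual implementation, and the point-list representation is an assumption.

```python
import math

def best_rigid_transform(src, dst):
    """Closed-form 2-D rigid alignment (Kabsch) between paired point lists."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    sxx = sxy = syx = syy = 0.0
    for (ax, ay), (bx, by) in zip(src, dst):
        ax, ay, bx, by = ax - csx, ay - csy, bx - cdx, by - cdy
        sxx += ax * bx
        sxy += ax * by
        syx += ay * bx
        syy += ay * by
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

def icp(src, dst, iterations=20):
    """Iterative Closest Point: estimate the rigid motion taking src onto dst."""
    current = list(src)
    for _ in range(iterations):
        # Pair each point with its nearest neighbour in the other cloud.
        pairs = [min(dst, key=lambda q, p=p: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)
                 for p in current]
        theta, tx, ty = best_rigid_transform(current, pairs)
        c, s = math.cos(theta), math.sin(theta)
        current = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in current]
    # The accumulated motion is the transform aligning the original cloud
    # onto its final, converged position.
    return best_rigid_transform(src, current)
```

The recovered (theta, tx, ty) between two consecutive scans is exactly the per-step odometry the description attributes to the analyzer that compares two data clouds.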
Additionally, the present invention also relates to an unmanned aerial vehicle that includes the aforementioned navigation system for an indoor enclosure.

Brief description of the figures

FIG. 1 shows a simplified block diagram according to one embodiment. FIG. 2 schematically shows the proposed cyclic navigation process.

Detailed description of the invention

For a better understanding of the scope of the invention, an example of a drone navigation system is described for illustrative and explanatory purposes that allows safe and reliable flight in enclosed spaces. For this, the system is equipped with the ability to detect and avoid obstacles by itself. The present embodiment example should not be considered limiting. A general objective of the embodiment of the invention is the ability to move to a specific point or travel assigned routes inside buildings autonomously and with high precision, avoiding unforeseen obstacles and bypassing them. The routes are defined by a series of waypoints (WP) to be followed, and they are previously loaded into memory, just like the building plan. Advantageously, drones provided with this system are capable of travelling routes automatically following a preset schedule, or on demand. During flight along these routes, the drone can capture data, transmit information in real time or receive remote instructions from users through the control center.

Special features of the present example:
• Positioning accuracy: error < 5 cm
• Navigation accuracy: error < 20 cm (maximum deviation from the marked route)
• Open source
• Reproducible, applicable to all indoor scenarios
• Does not require calibration
• Independent of drone size and operating scenario
• No installation required
• Portable and flexible service

FIG. 1 shows, therefore, the general architecture of the navigation system for the embodiment example.
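A route defined as a series of waypoints loaded into memory, as described above, can be represented and consumed very simply. The sketch below is illustrative: the (x, y, z) tuple representation is an assumption, and the 20 cm tolerance mirrors the stated navigation accuracy.

```python
import math

def advance_route(position, waypoints, tolerance=0.20):
    """Return the current target WP, consuming waypoints already reached.

    Waypoints are assumed to be (x, y, z) tuples in metres; the 0.20 m
    tolerance mirrors the stated navigation accuracy (error < 20 cm).
    The list is mutated in place, like a flight plan being consumed.
    """
    while waypoints:
        if math.dist(position, waypoints[0]) <= tolerance:
            waypoints.pop(0)   # reached: advance to the next WP
        else:
            return waypoints[0]
    return None                # route finished: trigger landing
```

A planner would call this each control cycle with the latest position estimate; a `None` result corresponds to the end-of-route condition that triggers the landing state described later.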
For the development of the proposed navigation system prototype, a drone with an open-software, open-hardware autopilot system has been taken as the basis. It incorporates an on-board computer running a Linux operating system and an Ardupilot-based autopilot, with which it is possible to interact through an SDK (other operating systems and other programs would also be applicable). The drone has the following hardware components: a 2D laser scanner for (x, y) and three distance measuring sensors 110, 112, 114 for height (z axis). On this hardware platform, and using ROS (Robot Operating System) as a framework, the functionality for autonomous indoor navigation has been developed, including: a management module 144, a positioning module 140 and a path planning module 142. The positioning module 140 provides the location of the drone with six degrees of freedom, reliably and with an accuracy better than 5 cm. The path planning module 142 is responsible for determining the movement orders for the drone to follow the waypoints (WPs) of the determined route, complying with safety limits and bypassing possible unforeseen obstacles in a safe way. Finally, the management module 144 represents the interface between the user and the drone, by means of which the collected data can be accessed and/or instructions sent to the drone in real time. The set of sensors that supports the navigation system for the interior of buildings comprises a laser scanner 102 and three distance sensors 110, 112, 114 to measure the distance to the floor and ceiling. The drone positioning calculation is preferably based on the Monte Carlo algorithm (2D map matching), which, combined with the data provided by the distance sensors 110, 112, 114 and the drone's own inertial measurement unit 116 (IMU), allows the 3D position to be estimated indoors.
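The Monte Carlo (2D map-matching) positioning just mentioned can be illustrated with a toy particle filter. This is a sketch under strong simplifying assumptions: a hypothetical rectangular room and a single forward range measurement, whereas the real system weighs full laser scans against the stored building plans.

```python
import math
import random

ROOM_W, ROOM_H = 10.0, 6.0  # hypothetical rectangular enclosure, in metres

def expected_range(x, y, yaw):
    """Ray-cast from (x, y) along yaw to the nearest wall of the rectangle."""
    dx, dy = math.cos(yaw), math.sin(yaw)
    ts = []
    if dx > 1e-9: ts.append((ROOM_W - x) / dx)
    if dx < -1e-9: ts.append(-x / dx)
    if dy > 1e-9: ts.append((ROOM_H - y) / dy)
    if dy < -1e-9: ts.append(-y / dy)
    return min((t for t in ts if t >= 0.0), default=float("inf"))

def monte_carlo_step(particles, measured_range, sigma=0.2):
    """One weight-and-resample cycle of Monte Carlo localisation.

    Each particle is a pose hypothesis (x, y, yaw); its weight is how well
    the map-predicted range agrees with the measured one.
    """
    weights = []
    for x, y, yaw in particles:
        err = expected_range(x, y, yaw) - measured_range
        weights.append(math.exp(-err * err / (2.0 * sigma * sigma)))
    total = sum(weights)
    if total == 0.0:
        return particles  # no particle explains the reading; keep the set
    # Resample proportionally to weight, then jitter to keep diversity.
    survivors = random.choices(particles, weights=weights, k=len(particles))
    return [(x + random.gauss(0, 0.05), y + random.gauss(0, 0.05),
             yaw + random.gauss(0, 0.01)) for x, y, yaw in survivors]
```

With a full scan, the same weighting would be applied per beam against the stored plan; the text attributes this role to the first estimator 106, running at 20 Hz.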
To reduce the computational load and develop a system compatible with most on-board computers, the problem has been simplified to 2D. The laser scanner 102 scans in 2D, and the Monte Carlo method implemented in a first estimator 106 obtains, at 20 Hz, an estimate of the yaw and the position in the horizontal plane. Said first estimator 106 compares the readings of the laser scanner 102 with the planes stored in a memory 105, after they have been processed by an analyzer 104 that in turn applies scan matching to compare the orientation and position of two point clouds. On the other hand, the estimate of the coordinate (z) is obtained by a second estimator 108 that receives readings of the distance between the drone and the ground, as well as between the drone and the ceiling, measured by the distance sensors 110, 112, 114. These estimated position measurements Pestimated (x, y, z, yaw) are integrated directly by the Kalman filter 118 together with the values provided by the on-board IMU 116, that is, angular velocities (vx, vy, vz) and accelerations (ax, ay, az), to provide the estimated positioning in three-dimensional space with 6 degrees of freedom (position and rotation in the three axes x, y and z). This solution yields errors of less than 5 cm while optimizing the computational load of the on-board computer. This information is used by the trajectory planning module 142. This module implements the in-flight decision-making process and its execution (take-off, travel avoiding collisions and re-planning alternative paths, and landing). The trajectory planning module 142 controls the orientation ("attitude") and the position of the drone according to the data provided by the positioning module 140, sending directions towards the target position (Pobjective), dynamically adjusting acceleration and speed depending on:
- The distance of the drone to obstacles (2D laser scanner 102).
- The current drone location (with six degrees of freedom).
- The next WP to be reached on the selected route (scheduled flight plan loaded in memory 105).

The drone must reach the determined Pobjective. The corrections necessary to achieve it are executed by the flight control and execution unit 128, which increases or decreases engine power. The actual drone position is dynamically updated with reliable data coming from the positioning module 140 (Pestimated). The operation repeats until the end of the route. In turn, the user can receive information from the drone in real time and send instructions (stop it, restart its route, land it or turn it off) through the management module 144. The three modules are presented in greater detail in the following paragraphs:

Positioning module 140: The positioning module 140 is responsible for locating the actual position of the drone (±5 cm) inside an enclosure (e.g. a building). It uses a scanning laser scanner 102 operating at a frequency of 40 Hz. In a first step, drone displacement is calculated by odometry, comparing the laser readings at the current instant with those obtained at the previous instant. For this, an analyzer 104 is used that implements a Scan Matching algorithm based on ICP techniques (Iterative Closest Point, an algorithm used to minimize the difference between two point clouds), fed by about 1,000 points from the laser scanner and working at a frequency of 20 Hz. The analyzer 104 estimates the drone direction vector (x, y, yaw) based on two point clouds read by the laser scanner 102 at two consecutive moments. At every moment, there is a cloud of points corresponding to the walls, obstacles, etc. of the enclosure, or whatever the laser is sweeping. For flight height control, the three distance sensors 110, 112, 114 are used, preferably based on Time-of-Flight infrared technology, which offers high-precision Zestimated readings at a very high operating frequency (1 kHz). Two of the distance sensors 110, 112 point to the ground.
These two measurements add redundancy and thus serve to differentiate temporary obstacles from the floor plane itself. For example, if the drone flies two metres from the ground and passes over a table with only one sensor, it would assimilate the table as the floor and climb two metres at a stroke. Having two measurements (e.g. from port and starboard) made with respect to different points on the ground, obstacles are discriminated, since one will indicate an abrupt change of height while the other remains constant. When the abrupt change in height occurs in the second sensor as well, the system identifies that the drone is completely over the table. The third sensor 114 points towards the ceiling and is used as a third reference in case of abnormal situations in which the ground sensors may send some type of erroneous measurement (for example, if the drone passes over a reflective surface that prevents the infrared measurement). Subsequently, the precise position calculation, which corrects the displacements calculated by odometry in the previous step, is performed. The position calculation algorithm used is based on the Monte Carlo probabilistic method and compares the reading of the laser scanner 102 with the planes of the environment loaded in the memory of the drone to obtain (Xestimated, Yestimated, Yawestimated); together with the value Zestimated, a reliable estimated position of the drone is available in an indoor space at 20 Hz. The inertial unit 116 of the drone provides acceleration and rotation information at a frequency of 1 kHz (every millisecond). These variations make it possible to predict its position.
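The port/starboard redundancy logic described above reduces to a small classifier over the changes seen by the two downward sensors. In this sketch the 0.5 m jump threshold is an illustrative assumption.

```python
def classify_height_change(delta_port, delta_starboard, threshold=0.5):
    """Distinguish a local obstacle from a change of the floor level itself.

    delta_port / delta_starboard: change (in metres) between two consecutive
    readings of the port and starboard downward ToF sensors. If only one
    sensor sees an abrupt jump, a local obstacle (e.g. the edge of a table)
    is assumed; if both jump together, the floor level has genuinely changed.
    """
    port_jump = abs(delta_port) > threshold
    starboard_jump = abs(delta_starboard) > threshold
    if port_jump and starboard_jump:
        return "floor-level change"
    if port_jump or starboard_jump:
        return "local obstacle"
    return "steady"
```

When the second sensor later sees the same jump, the drone is fully over the obstacle, as in the table example above.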
To reduce the accumulation of errors, the position is updated with reliable data (x, y, z, yaw) that come from comparing the readings of the laser scanner 102 with the map loaded in memory 105. A first estimator 106 applies the Monte Carlo algorithm and obtains the estimated yaw and the estimated 2D position of the drone at a frequency of 20 Hz, and a second estimator 108 provides a reliable measure of the height component z, also at 20 Hz, to complete the position of the drone in 3D. The Kalman filter 118 integrates and combines the data from the IMU 116 with the 3D position data, thus providing a reliable speed and estimated position Pestimated at a frequency of 1 kHz (every millisecond). The Kalman filter 118 makes a prediction at 1 kHz with the accelerations x, y, z and the angular velocities x, y, z calculated in the IMU 116. The correction of the Kalman filter is performed using the x, y and yaw values obtained with the Monte Carlo method by the first estimator 106 and the heights in z obtained by the second estimator 108 using the height sensors 110, 112, 114. These corrections are made at 20 Hz.

Planning module 142: Regarding the trajectory, the planning module 142 allows the drone to navigate safely in an indoor environment where accuracy is a critical factor. The drone must be able to make decisions and calculate alternative routes in response to unexpected obstacles. For this purpose, the planning module 142 includes a mission control unit 124 and a flight execution and control unit 128. The mission control unit 124 is in charge of analyzing drone displacement and controlling possible collisions, to decide whether the route needs to be re-planned or whether any additional measures must be implemented.
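The 1 kHz prediction / 20 Hz correction cycle of the Kalman filter described above can be illustrated with a minimal one-dimensional filter for the height channel. The noise values are illustrative assumptions; the real filter fuses all six degrees of freedom.

```python
class HeightKalman:
    """Minimal 1-D Kalman filter for the height channel: state (z, vz).

    predict() is meant to run at the IMU rate (1 kHz) with the measured
    vertical acceleration; correct() at 20 Hz with the range-derived height.
    """

    def __init__(self, z=0.0, vz=0.0):
        self.z, self.vz = z, vz
        self.p = [[1.0, 0.0], [0.0, 1.0]]  # covariance of (z, vz)
        self.q = 0.01   # process noise added per prediction step (assumed)
        self.r = 0.05   # range-measurement noise variance (assumed)

    def predict(self, az, dt=0.001):
        # Constant-acceleration motion model driven by the IMU reading.
        self.z += self.vz * dt + 0.5 * az * dt * dt
        self.vz += az * dt
        p00, p01 = self.p[0]
        p10, p11 = self.p[1]
        self.p = [
            [p00 + dt * (p01 + p10) + dt * dt * p11 + self.q, p01 + dt * p11],
            [p10 + dt * p11, p11 + self.q],
        ]

    def correct(self, z_measured):
        # Measurement model: we observe z directly (H = [1, 0]).
        p00, p01 = self.p[0]
        p10, p11 = self.p[1]
        s = p00 + self.r                  # innovation variance
        kz, kv = p00 / s, p10 / s         # Kalman gains for z and vz
        innovation = z_measured - self.z
        self.z += kz * innovation
        self.vz += kv * innovation
        self.p = [
            [(1.0 - kz) * p00, (1.0 - kz) * p01],
            [p10 - kv * p00, p11 - kv * p01],
        ]
```

Between two 20 Hz corrections the filter runs fifty 1 kHz predictions, exactly the rate structure the text describes for the on-board filter 118.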
Executing these changes is the responsibility of the flight execution and control unit 128. When an object is within range, the drone detects it with its laser scanner 102, identifies it as an obstacle and updates the map of obstacles used for navigation. Consequently, obstacle detection and route planning are done in 2D, on the plane of the laser scanner 102. This planning module 142 is also compatible with a 3D artificial vision camera (not shown in the figure). The incorporation of a 3D camera may be recommended to increase the field of vision and recognize obstacles in 3D.

Management module 144: A management and control web application, compatible with mobile devices, allows the user of this system to access the drone from anywhere and at any time. Users can access all the information in real time and control the drone's actions in great detail. The functionalities of the control application are divided into three large groups:
- Consultation functionalities. Visualization of collected data, visualization of the position of the drone on the facilities map, visualization of notifications or alerts, and reproduction of finished routes (consultation of historical data).
- Management functionalities. Route management (create/modify/delete rounds, create/modify/delete WPs), route calendar management (create/modify/delete calendar), assignment of routes to other system drones and route control (start/abort, pause/resume navigation).
- Control functionalities. Sending orders to the drone during the routes: pause and resume travel, go to a specific point, abort route and land, return to base, etc.

The control and communications server located in the facilities where the system is deployed acts as a link between the drones and the control application. It is in charge of maintaining the routes and schedules, and sends the necessary orders to the drones so that they fulfil the orders sent by the users from the control application.
It maintains a historical record of all tasks performed and receives, processes and records the information collected by the drone.

FIG. 2 illustrates a cyclic process for navigation with three steps, namely: mission control step 202, position control step 204 and flight control step 206. Given a route (WPinitial -> WPfinal), autonomous navigation with displacement control and obstacle avoidance is implemented through a cyclic process that can be subdivided into the three levels detailed below.

Mission control step 202: This is the highest level. It manages the states of the drone at a high level (take-off, en route, route re-planning and landing) at 40 Hz and activates the operations that each state involves (mainly through the mission control unit 124).
"Take-off" state: ignition of the drone and lift into the air.
"Route" state: tracking of the WPs of the selected route.
"Re-planning" state: in view of an obstacle between the drone and the next WP to reach, the route is re-planned looking for the shortest alternative path to the desired WP.
"Landing" state: controlled drone descent and engine shutdown. It is activated in several cases:
- When a given route has been completed.
- When the drone has been in the re-planning state (looking for alternative paths) for longer than its maximum search time without being able to reach the WP; for example, if the drone is surrounded by obstacles.
- In case of emergency (loss of position due to failure of a sensor, the on-board computer being turned off, etc.).

Position control step 204: is responsible for managing the high-level orders of the mission control step 202. For this purpose, it generates the intermediate Pobjective target positions, taking into account the drone's position and speed, for a safe flight. In this step, the safety limits are configured, such as maximum flight height, minimum safety distance from the drone to obstacles, maximum speed, and
acceleration or deceleration depending on the distance to obstacles, etc. These safety parameters can be configured depending on the size of the drone, its technology and the operating scenario. This step is mainly associated with the flight control and execution unit 128, which sends the order (arrive at the next WP) and determines the increments of displacement and rotation, at 40 Hz, needed to reach each Pobjective position (x, y, z, yaw) safely.

Flight control step 206: is at a lower level than the previous step; it acts on the engines that govern the drone's movement, applying the appropriate accelerations and turns based on the orders received from the mission control unit 124 (displacement towards Pobjective). This is where the positioning module 140 comes in. The error (bias) between the Pobjective (determined in the position control step 204) and the Pestimated (determined by the positioning module 140) is calculated. Based on this error, the PID error controller determines the pitch and roll angles, as well as the thrust, that minimize this error. Subsequently, the speeds that each of the engines has to apply to reach said angles and thrust are determined, with the aim of bringing the drone to the desired position at an appropriate speed and acceleration.

Industrial application

This invention proposes a new solution for indoor navigation. It is applicable to multiple industrial sectors:
- Manufacturing industry: textile, metal, chemical, hydrocarbons, etc.
- Naval and industrial construction
- Production of offshore energy
- Aeronautics

The following are some areas of application aimed at improving processes in the industrial sector:

Storage and distribution:
- Stock control in warehouses and department stores
- Linear order control in supermarkets

Process monitoring and physical surveillance:
- Process control in factories
- Supervision of elements in industrial environments
- Control of mining operations
- Security in public or private buildings
- Work in adverse environments and conditions

Inspection in factories and industrial risk environments:
- Nuclear and thermal power plants, refineries, chemical plants, etc.
- Wind turbine blade inspection
- Fuel tank inspection
- Inspection of ships and offshore platforms

I. Direct applications revolutionize the field of indoor physical surveillance: surveillance based on autonomous and intelligent drones that follow their routes in sports, cultural or critical-infrastructure facilities in all industrial sectors. The security agent can watch the drone recordings at all times, receive alerts and, if necessary, modify the route of a certain drone or even send more drones to an area considered to be under threat. Interoperability with camera control centres results in a dynamic and versatile system whose service is more efficient and has greater reach (hard-to-reach or hostile areas) than traditional physical surveillance services such as fixed CCTV cameras or security guards. In addition, its implementation is faster and cheaper because it involves neither wiring installation nor hiring staff around the clock.

II. Stock management in department stores, incorporating into the video module barcode recognition or another identifier to carry out the reading of the elements.
The drones inside the warehouse make routes along the corridors, at all heights, reviewing each level and counting the items in stock by reading each item's barcode or identifier. This system has great advantages compared to the current systems that perform these operations, both in the dynamism of the implementation and in the cost and scope of the service.

III. Inspection of large structures such as ship hulls, large turbines or pipes, recognizing cracks or perforations and inspecting their quality with a video tool. In this application the drone could even perform a mechanical operation by incorporating an additional tool. The advantage of this application is mainly the reduction of occupational accidents.

IV. Software installation in drones of different technologies and sizes; adaptability to the application of each sector. Possibility of combining external flights with internal flights in the same UAV; in this case two modules would be combined to cover the entire flight space:
- GPS flights (accuracy of the order of 1-2 m).
- Internal flights controlled by this system (accuracy < 5 cm, with autonomous and intelligent flights).
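Returning to flight control step 206 of the detailed description: the PID loop that maps the Pobjective - Pestimated error to attitude commands can be sketched as follows. The gains, the axis convention and the tilt limit are illustrative assumptions, not values disclosed by the patent.

```python
class Pid:
    """Textbook PID loop, applied per axis to the Pobjective - Pestimated error."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def attitude_commands(p_objective, p_estimated, pid_x, pid_y, dt=0.025):
    """Map the horizontal position error to pitch/roll setpoints (radians).

    Axis convention (an assumption): positive pitch drives the drone along +x,
    positive roll along +y. Commands are clamped to a safe maximum tilt.
    """
    ex = p_objective[0] - p_estimated[0]
    ey = p_objective[1] - p_estimated[1]
    max_tilt = 0.35  # about 20 degrees
    pitch = max(-max_tilt, min(max_tilt, pid_x.step(ex, dt)))
    roll = max(-max_tilt, min(max_tilt, pid_y.step(ey, dt)))
    return pitch, roll
```

A motor mixer would then translate the pitch, roll, yaw and thrust setpoints into individual engine speeds, which is the role the description assigns to the flight control and execution unit 128.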
Claims:
Claims

[1] 1. Navigation system for an unmanned aerial vehicle in an interior enclosure, characterized in that it comprises a positioning module (140) comprising an on-board inertial unit (116) and a memory (105); and in that the positioning module (140) further comprises a plurality of distance sensors (110, 112, 114) and a laser scanner (102), where said positioning module (140) is configured to:
- estimate on the horizontal plane the 2D position (x, y) and yaw of the vehicle, by means of a laser scan performed by the laser scanner (102) and a comparison with a map corresponding to the interior enclosure stored in the memory (105);
- estimate the vertical distance of the vehicle to the ceiling and to the ground, or to a possible upper or lower obstacle, by means of the plurality of distance sensors (110, 112, 114);
- combine the estimated 3D position and yaw values of the vehicle with the data provided by the inertial unit (116) and make corrections to the preset route.

[2] 2. Navigation system according to claim 1, wherein the positioning module (140) is further configured to analyze the data obtained by the laser scanner (102) by applying a Scan Matching algorithm.

[3] 3. Navigation system according to claim 1 or 2, wherein the positioning module (140) comprises a first estimator (106) configured to analyze the data obtained by the laser scanner (102) and the data corresponding to the site map stored in the memory (105), applying the Monte Carlo algorithm to obtain an estimate of the 2D position (x, y) and yaw of the aerial vehicle.

[4] 4. Navigation system according to any one of claims 1 to 3, wherein the positioning module (140) comprises a second estimator (108) further configured to analyze the data obtained by a pair of distance sensors (110, 112) to estimate redundantly the distance of the aerial vehicle to the ground.

[5] 5.
Navigation system according to any one of claims 1 to 4, wherein the positioning module (140) further comprises a Kalman filter (118) configured to apply the Kalman algorithm to combine the estimated 3D position and yaw values of the vehicle with the acceleration and speed data of the aerial vehicle provided by the inertial unit (116).

[6] 6. Navigation system according to any one of claims 1 to 5, further comprising a planning module (142) configured to receive corrections to the established route from the positioning module (140) and to modify the trajectory of the aerial vehicle.

[7] 7. Navigation system according to claim 6, wherein the planning module (142) also includes a flight control and execution unit (128) configured to receive orders to modify the preset trajectory and act on the engines of the aerial vehicle.

[8] 8. Navigation system according to any one of claims 1 to 7, wherein the positioning module (140) is configured to update the map stored in the memory (105) corresponding to the interior enclosure in accordance with the data from the distance sensors (110, 112, 114) and the laser scanner (102).

[9] 9. Navigation system according to any one of claims 1 to 8, wherein the positioning module (140) further comprises a 3D artificial vision camera.

[10] 10. Unmanned aerial vehicle comprising the navigation system according to any one of claims 1 to 9.
Similar technologies:
Publication number / author | Publication date | Title
- Saska et al. | 2017 | System for deployment of groups of unmanned micro aerial vehicles in GPS-denied environments using onboard visual relative localization
- US9983584B2 | 2018-05-29 | Method and apparatus for developing a flight path
- Droeschel et al. | 2016 | Multilayered mapping and navigation for autonomous micro aerial vehicles
- US10250821B2 | 2019-04-02 | Generating a three-dimensional model of an industrial plant using an unmanned aerial vehicle
- WO2017168423A1 | 2017-10-05 | System and method for autonomous guidance of vehicles
- Meister et al. | 2008 | Adaptive path planning for a VTOL-UAV
- US20210318696A1 | 2021-10-14 | System and method for perceptive navigation of automated vehicles
- US20200301445A1 | 2020-09-24 | Geo-fiducials for UAV navigation
- Sinha et al. | 2009 | Multi UAV coordination for tracking the dispersion of a contaminant cloud in an urban region
- Merz et al. | 2013 | Dependable low-altitude obstacle avoidance for robotic helicopters operating in rural areas
- Dowling et al. | 2018 | Accurate indoor mapping using an autonomous unmanned aerial vehicle
- Ochoa et al. | 2017 | Fail-safe navigation for autonomous urban multicopter flight
- Yu et al. | 2009 | Vision-based local multi-resolution mapping and path planning for miniature air vehicles
- ES2684847B1 | 2019-07-10 | Navigation system for an unmanned aerial vehicle in an interior enclosure
- Legovich et al. | 2018 | Integration of modern technologies for solving territory patrolling problems with the use of heterogeneous autonomous robotic systems
- Farrell | 2009 | Waypoint generation based on sensor aimpoint
- Jiang et al. | 2016 | Towards autonomous flight of an unmanned aerial system in plantation forests
- Prasad et al. | 2018 | Positioning of UAV using algorithm for monitoring the forest region
- Jung et al. | 2017 | Robustness for Scalable Autonomous UAV Operations
- KR20220031574A | 2022-03-11 | 3D positioning and mapping system and method
- US11150089B2 | 2021-10-19 | Unmanned aerial vehicle control point selection system
- Hartman et al. | 2019 | Development of a Velocity Controller for Following a Human Using Target Velocity in a GPS-Denied Environment
- Mattison et al. | 2011 | An autonomous ground explorer utilizing a vision-based approach to indoor navigation
- Hartman | 2018 | Development of a Velocity Controller for Following a Human Using Target Velocity in GPS-Denied Environments
- Zhou et al. | 2017 | On-board sensors-based indoor navigation techniques of micro aerial vehicle
Patent family:
Publication number | Publication date
ES2684847B1 | 2019-07-10
Cited references:
Publication number | Filing date | Publication date | Applicant | Title
US20100084513A1 | 2008-09-09 | 2010-04-08 | Aeryon Labs Inc. | Method and system for directing unmanned vehicles
EP3103043A1 | 2014-09-05 | 2016-12-14 | SZ DJI Technology Co., Ltd. | Multi-sensor environmental mapping
Legal status:
2018-10-04 | BA2A | Patent application published | Ref document number: 2684847, Country: ES, Kind code: A1, Effective date: 2018-10-04
2019-07-10 | FG2A | Definitive protection | Ref document number: 2684847, Country: ES, Kind code: B1, Effective date: 2019-07-10
Priority:
Application number | Filing date | Title
ES201730538A (ES2684847B1) | 2017-03-31 | NAVIGATION SYSTEM FOR AN UNMANNED AERIAL VEHICLE IN AN INTERIOR ENCLOSURE